Approximating a Gram Matrix for Improved Kernel-Based Learning
Authors
Abstract
A problem for many kernel-based methods is that the amount of computation required to find the solution scales as O(n³), where n is the number of training examples. We develop and analyze an algorithm to compute an easily-interpretable low-rank approximation to an n × n Gram matrix G such that computations of interest may be performed more rapidly. The approximation is of the form G̃_k = C W_k^+ C^T, where C is a matrix consisting of a small number c of columns of G and W_k is the best rank-k approximation to W, the matrix formed by the intersection between those c columns of G and the corresponding c rows of G. An important aspect of the algorithm is the probability distribution used to randomly sample the columns; we will use a judiciously-chosen and data-dependent nonuniform probability distribution. Let ‖·‖_2 and ‖·‖_F denote the spectral norm and the Frobenius norm, respectively, of a matrix, and let G_k be the best rank-k approximation to G. We prove that by choosing O(k/ε⁴) columns, ‖G − C W_k^+ C^T‖_ξ ≤ ‖G − G_k‖_ξ + ε ∑_{i=1}^n G_ii², both in expectation and with high probability, for both ξ = 2, F.
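To make the construction above concrete, here is a minimal NumPy sketch of this style of column-sampling approximation. It is an illustrative sketch rather than the paper's reference implementation: the name approximate_gram is mine, the diagonal-based probabilities p_i ∝ G_ii² are used as one example of a data-dependent distribution, and the 1/√(c·p_i) rescaling of C and W follows common importance-sampling conventions.

```python
import numpy as np

def approximate_gram(G, c, k, rng=None):
    """Return an approximation of the form C @ W_k^+ @ C.T built from c sampled columns of G."""
    rng = np.random.default_rng(0) if rng is None else rng
    n = G.shape[0]

    # Nonuniform, data-dependent sampling probabilities (assumed form: p_i proportional to G_ii^2).
    p = np.diag(G) ** 2
    p = p / p.sum()
    idx = rng.choice(n, size=c, replace=True, p=p)

    # Sampled columns of G and the induced c x c intersection matrix W,
    # rescaled by 1/sqrt(c * p_i) as is common in importance sampling.
    scale = 1.0 / np.sqrt(c * p[idx])
    C = G[:, idx] * scale                               # n x c
    W = G[np.ix_(idx, idx)] * np.outer(scale, scale)    # c x c, symmetric PSD

    # Best rank-k approximation W_k via the eigendecomposition of W,
    # followed by its Moore-Penrose pseudoinverse W_k^+.
    vals, vecs = np.linalg.eigh(W)                      # eigenvalues in ascending order
    top = np.argsort(vals)[::-1][:k]
    keep = top[vals[top] > 1e-12]                       # discard (near-)zero eigenvalues
    Wk_pinv = (vecs[:, keep] / vals[keep]) @ vecs[:, keep].T

    return C @ Wk_pinv @ C.T                            # n x n matrix of rank at most k

# Example usage on a small synthetic Gram matrix.
X = np.random.default_rng(1).normal(size=(300, 5))
G = X @ X.T                                             # exact 300 x 300 Gram matrix
G_approx = approximate_gram(G, c=40, k=5)
print(np.linalg.norm(G - G_approx, "fro"))
```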
Similar Resources
On the Nyström Method for Approximating a Gram Matrix for Improved Kernel-Based Learning
A problem for many kernel-based methods is that the amount of computation required to find the solution scales as O(n³), where n is the number of training examples. We develop and analyze an algorithm to compute an easily-interpretable low-rank approximation to an n × n Gram matrix G such that computations of interest may be performed more rapidly. The approximation is of the form G̃_k = C W_k^+ C^T ...
Bayesian Nonparametric Kernel-Learning
Kernel methods are ubiquitous tools in machine learning. However, there is often little reason for the common practice of selecting a kernel a priori. Even if a universal approximating kernel is selected, the quality of the finite sample estimator may be greatly affected by the choice of kernel. Furthermore, when directly applying kernel methods, one typically needs to compute an N×N Gram matrix...
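The N×N Gram matrix mentioned above is the kernel matrix over the training set. As a small illustration (not taken from this paper; the RBF kernel and the bandwidth gamma are arbitrary choices), the sketch below shows why forming that matrix directly becomes the bottleneck as N grows.

```python
import numpy as np

def rbf_gram(X, gamma=1.0):
    """Dense RBF Gram matrix with entries K[i, j] = exp(-gamma * ||x_i - x_j||^2)."""
    sq_norms = np.sum(X ** 2, axis=1)
    sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(sq_dists, 0.0))   # N x N: time and memory grow as N^2

X = np.random.default_rng(1).normal(size=(500, 10))
K = rbf_gram(X)        # 500 x 500 here; for N in the millions the dense matrix is impractical,
print(K.shape)         # which is what motivates low-rank approximations like the one above.
```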
Neural Network-Based Learning Kernel for Automatic Segmentation of Multiple Sclerosis Lesions on Magnetic Resonance Images
Background: Multiple Sclerosis (MS) is a degenerative disease of the central nervous system. MS patients have some dead tissues in their brains called MS lesions. MRI is an imaging technique sensitive to soft tissues such as the brain that shows MS lesions as hyper-intense or hypo-intense signals. Since manual segmentation of these lesions is a laborious and time-consuming task, automatic segmentation ...
CAS WAVELET METHOD FOR THE NUMERICAL SOLUTION OF BOUNDARY INTEGRAL EQUATIONS WITH LOGARITHMIC SINGULAR KERNELS
In this paper, we present a computational method for solving boundary integral equations with logarithmic singular kernels which occur as reformulations of a boundary value problem for the Laplacian equation. The method is based on the use of the Galerkin method with CAS wavelets constructed on the unit interval as basis. This approach utilizes the non-uniform Gauss-Legendre quadrature rule for ...
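As a rough illustration of the Gauss-Legendre ingredient mentioned in this entry (only the quadrature building block, not the CAS-wavelet Galerkin scheme; the interval grading and point counts below are arbitrary choices), a composite rule in NumPy might look like this.

```python
import numpy as np

def gauss_legendre(f, a, b, n_points=8):
    """Approximate the integral of f over [a, b] with an n_points Gauss-Legendre rule."""
    nodes, weights = np.polynomial.legendre.leggauss(n_points)  # rule on [-1, 1]
    x = 0.5 * (b - a) * nodes + 0.5 * (a + b)                   # map nodes to [a, b]
    return 0.5 * (b - a) * np.sum(weights * f(x))

# Example: a weakly (logarithmically) singular integrand on (0, 1], handled by
# splitting the interval into subintervals graded toward the singularity at 0.
f = lambda x: np.log(x) * np.cos(x)
breaks = np.geomspace(1e-8, 1.0, 20)
approx = sum(gauss_legendre(f, a, b) for a, b in zip(breaks[:-1], breaks[1:]))
print(approx)   # the exact value of the integral over (0, 1] is -Si(1), about -0.946083
```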
An infeasible interior-point method for the $P_*$-matrix linear complementarity problem based on a trigonometric kernel function with full-Newton step
An infeasible interior-point algorithm for solving the $P_*$-matrix linear complementarity problem based on a kernel function with trigonometric barrier term is analyzed. Each (main) iteration of the algorithm consists of a feasibility step and several centrality steps, whose feasibility step is induced by a trigonometric kernel function. The complexity result coincides with the best result for infea...
Journal:
Volume, Issue:
Pages: -
Publication year: 2005